Improving Precision of the Subspace Information Criterion
Author
Abstract
Evaluating the generalization performance of learning machines without using additional test samples is one of the most important issues in the machine learning community. The subspace information criterion (SIC) is one of the methods for this purpose, and it has been shown to be an unbiased estimator of the generalization error with finite samples. Although the mean of SIC agrees with the true generalization error even in small-sample cases, the scatter of SIC can be large under some severe conditions. In this paper, we therefore investigate what degrades the precision of SIC and discuss how its precision can be improved.
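The abstract does not reproduce the criterion itself. As a rough illustration only, the sketch below computes SIC for a linear estimator in the way it is commonly formulated in earlier work by Sugiyama and Ogawa: an estimated squared-bias term plus a variance term, both corrected with the noise variance. The names X_learn, X_unbiased, and sigma2 are illustrative assumptions, not quantities taken from this paper.

import numpy as np

def sic(y, X_learn, X_unbiased, sigma2):
    # Sketch of the subspace information criterion for a linear estimator
    # alpha_hat = X_learn @ y, assuming X_unbiased @ y is an unbiased
    # estimate of the true parameters and sigma2 is the (known or
    # pre-estimated) noise variance. Assumed formulation, not taken
    # verbatim from this paper.
    D = X_learn - X_unbiased
    d = D @ y
    bias_sq = d @ d - sigma2 * np.trace(D @ D.T)       # unbiased estimate of the squared bias
    variance = sigma2 * np.trace(X_learn @ X_learn.T)  # variance of the estimator
    return bias_sq + variance

# Toy usage: ridge regression scored against ordinary least squares
# (an unbiased reference) on synthetic data.
rng = np.random.default_rng(0)
n, p = 30, 5
A = rng.normal(size=(n, p))                      # design matrix
y = A @ rng.normal(size=p) + 0.1 * rng.normal(size=n)
X_ols = np.linalg.pinv(A)                        # unbiased learning matrix
X_ridge = np.linalg.solve(A.T @ A + 0.5 * np.eye(p), A.T)
print(sic(y, X_ridge, X_ols, sigma2=0.01))

In model selection, the same observations y would be scored under every candidate learning matrix, and the candidate with the smallest SIC value chosen.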
Similar articles
About Subspace-Frequently Hypercyclic Operators
In this paper, we introduce subspace-frequently hypercyclic operators. We show that these operators are subspace-hypercyclic and that there are subspace-hypercyclic operators that are not subspace-frequently hypercyclic. There is a criterion, similar to the subspace-hypercyclicity criterion, that implies subspace-frequent hypercyclicity, and if an operator $T$ satisfies this criterion, then $T \oplus T$ is sub...
Subspace-diskcyclic sequences of linear operators
A sequence $\{T_n\}_{n=1}^{\infty}$ of bounded linear operators on a separable infinite-dimensional Hilbert space $\mathcal{H}$ is called subspace-diskcyclic with respect to the closed subspace $M \subseteq \mathcal{H}$ if there exists a vector $x \in \mathcal{H}$ such that the disk-scaled orbit $\{\alpha T_n x : n \in \mathbb{N},\, \alpha \in \mathbb{C},\, |\alpha| \leq 1\} \cap M$ is dense in $M$. The goal of t...
Subspace system identification
We give a general overview of the state of the art in subspace system identification methods. We have restricted ourselves to the most important ideas and developments since the methods appeared in the late eighties. First, the basics of linear subspace identification are summarized. The different algorithms found in the literature (such as N4SID, MOESP, CVA) are discussed and put into a unifyin...
Trading Variance Reduction with Unbiasedness: The Regularized Subspace Information Criterion for Robust Model Selection in Kernel Regression
A well-known result by Stein (1956) shows that in particular situations, biased estimators can yield better parameter estimates than their generally preferred unbiased counterparts. This letter follows the same spirit, as we will stabilize the unbiased generalization error estimates by regularization and finally obtain more robust model selection criteria for learning. We trade a small bias aga...
A Novel Noise Reduction Method Based on Subspace Division
This article presents a new subspace-based technique for reducing noise in time-series signals. In the proposed approach, the signal is first represented as a data matrix. Then, using Singular Value Decomposition (SVD), the noisy data matrix is divided into a signal subspace and a noise subspace. In this subspace division, each derivative of the singular values with respect to rank order is u...
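As a generic illustration of the SVD step described above (the paper's own rank-selection rule based on derivatives of the singular values is not reproduced here), the following sketch embeds a 1-D time series in a Hankel data matrix, keeps the leading singular components as the signal subspace, and reconstructs the denoised signal by anti-diagonal averaging. The window length and rank are supplied by hand and are illustrative assumptions.

import numpy as np
from scipy.linalg import hankel

def svd_denoise(x, window, rank):
    # Embed the series in a Hankel data matrix H with H[i, j] = x[i + j].
    n = len(x)
    H = hankel(x[:window], x[window - 1:])
    # Split into signal and noise subspaces via SVD; keep the leading components.
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H_sig = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Recover a 1-D signal by averaging H_sig along its anti-diagonals.
    out = np.zeros(n)
    counts = np.zeros(n)
    rows, cols = H_sig.shape
    for i in range(rows):
        for j in range(cols):
            out[i + j] += H_sig[i, j]
            counts[i + j] += 1
    return out / counts

# Toy usage on a hypothetical noisy sine wave.
t = np.linspace(0, 4 * np.pi, 200)
noisy = np.sin(t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
clean = svd_denoise(noisy, window=40, rank=2)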
Publication date: 2003